69 research outputs found
Global Sensitivity Analysis of Stochastic Computer Models with joint metamodels
Global sensitivity analysis, used to quantify the influence of uncertain input
variables on the response variability of a numerical model, is typically
applied to deterministic computer codes (for which the same set of input
variables always gives the same output value). This paper proposes a global
sensitivity analysis methodology for stochastic computer codes (whose
variability is induced by uncontrollable variables). The framework of the
joint modeling of the mean and dispersion of heteroscedastic data is used. To
deal with the complexity of computer experiment outputs, nonparametric joint
models (based on Generalized Additive Models and Gaussian processes) are
discussed. The relevance of these new models is analyzed in terms of the
resulting variance-based sensitivity indices on two case studies. Results show
that the joint modeling approach leads to accurate sensitivity index estimates
even when clear heteroscedasticity is present.
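The mean/dispersion joint-modeling idea can be illustrated with a minimal numpy sketch, in which simple polynomial fits stand in for the paper's GAM- and GP-based joint models; the toy simulator and all settings below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_code(x):
    """Toy stochastic simulator: both the mean and the dispersion depend on x."""
    return np.sin(np.pi * x) + rng.normal(scale=0.1 + 0.4 * x**2, size=x.shape)

x = rng.uniform(-1.0, 1.0, 2000)
y = stochastic_code(x)

# Step 1: fit a mean component (polynomial stand-in for the GAM/GP mean model)
mean_coef = np.polyfit(x, y, deg=7)
res2 = (y - np.polyval(mean_coef, x))**2

# Step 2: fit a dispersion component on the log squared residuals
disp_coef = np.polyfit(x, np.log(res2 + 1e-12), deg=2)

# The dispersion model recovers the shape of the heteroscedastic noise level
# (up to a constant bias from E[log chi^2_1], which does not change the shape)
sd_hat = np.sqrt(np.exp(np.polyval(disp_coef, np.array([-1.0, 0.0, 1.0]))))
print(sd_hat)
```

Once both components are fitted, variance-based sensitivity indices can be computed separately on the mean and on the dispersion part, which is the point of the joint approach.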
Sensitivity analysis with dependence and variance-based measures for spatio-temporal numerical simulators
In the event of a radioactive release into the environment, modeling the atmospheric dispersion of radionuclides is particularly useful for emergency response procedures and risk assessment. For this, the CEA has developed a numerical simulator, called Ceres-Mithra, to predict spatial maps of radionuclide concentrations at different instants. This computer code depends on many uncertain scalar and temporal parameters describing the radionuclide, release, or weather characteristics. The purpose is to detect the input parameters whose uncertainties most affect the predicted concentrations and to quantify their influence. To this end, we present various measures for the sensitivity analysis of a spatial model. Some of them lead to as many analyses as there are spatial locations (site sensitivity indices), while others consider a single analysis with respect to the whole spatial domain (block sensitivity indices). For both categories, variance-based and dependence measures are considered, based on recent literature. All of these sensitivity measures are applied to the Ceres-Mithra computer code and compared to each other, showing the complementarity of block and site sensitivity analyses. Finally, a sensitivity analysis summarizing the input uncertainty contribution over the entire spatio-temporal domain is proposed.
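The distinction between site and block sensitivity indices can be sketched on a toy linear "spatial" model; the simulator, its weights, and the correlation-based index formula below are illustrative assumptions, not the Ceres-Mithra code:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20000
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)

# Toy spatial simulator on a 1-D grid: the input weights vary with location s
s = np.linspace(0.0, 1.0, 51)
y = np.outer(x1, s) + np.outer(x2, 1.0 - s)   # shape (n, 51)

# Site indices: one analysis per location; for this linear model the
# first-order Sobol' index of x1 at location s is corr(x1, Y(s))^2
s1_site = np.array([np.corrcoef(x1, y[:, j])[0, 1]**2 for j in range(len(s))])

# Block index: a single analysis over the whole map, aggregating the site
# indices weighted by the local output variance
var_s = y.var(axis=0)
s1_block = np.sum(var_s * s1_site) / np.sum(var_s)
print(s1_block)   # close to 0.5 by symmetry of the toy weights
```

The site map shows where each input matters, while the single block index summarizes its contribution over the whole domain, which is why the two analyses are complementary.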
New improvements in the use of dependence measures for sensitivity analysis and screening
Physical phenomena are commonly modeled by numerical simulators. Such codes can take a large number of uncertain parameters as input, and it is important to identify their influence via a global sensitivity analysis (GSA). However, these codes can be time-consuming, which prevents a GSA based on the classical Sobol' indices, as these require too many simulations; this is especially true when the number of inputs is large. To address this limitation, we consider recent advances in dependence measures, focusing on the distance correlation and the Hilbert-Schmidt independence criterion (HSIC). Our objective is to study these indices and use them for screening. Numerical tests reveal some differences between dependence measures and classical Sobol' indices, and preliminary answers to "Which sensitivity indices for which situation?" are derived. Then, two approaches are proposed to use the dependence measures for screening. The first directly uses these indices with independence tests; asymptotic tests and their spectral extensions exist and are detailed. For higher accuracy with small samples, we propose a non-asymptotic version based on bootstrap sampling. The second approach is based on a linear model associating two simulations, which explains their output difference as a weighted sum of their input differences. From this, a bootstrap method is proposed for the selection of the influential inputs. We also propose a heuristic approach for the calibration of the HSIC Lasso method. Numerical experiments are performed and show the potential of these approaches for screening when many inputs are not influential.
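The HSIC-based screening idea can be sketched with a small numpy implementation of the standard biased HSIC V-statistic with Gaussian kernels and a median-heuristic bandwidth; the toy model and the screening threshold are illustrative, not taken from the paper:

```python
import numpy as np

def rbf_gram(v):
    """Gaussian-kernel Gram matrix with the median-heuristic bandwidth."""
    d2 = (v[:, None] - v[None, :])**2
    bw2 = np.median(d2[d2 > 0])
    return np.exp(-d2 / bw2)

def hsic(x, y):
    """Biased V-statistic estimator: HSIC = tr(K H L H) / n^2."""
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(rbf_gram(x) @ H @ rbf_gram(y) @ H) / n**2

rng = np.random.default_rng(2)
n = 300
x1, x2 = rng.uniform(size=n), rng.uniform(size=n)
y = np.sin(2 * np.pi * x1)   # the output depends on x1 only; x2 is inert

# Screening: the influential input gets a much larger dependence score
print(hsic(x1, y) > 5 * hsic(x2, y))   # → True
```

In practice the cutoff between "influential" and "inert" is set by an independence test (asymptotic or bootstrap, as discussed above), not by a fixed ratio as in this sketch.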
An efficient methodology for modeling complex computer codes with Gaussian processes
Complex computer codes are often too computationally expensive to be used
directly for uncertainty propagation studies, global sensitivity analysis, or
optimization. A well-known and widely used way to circumvent this limitation
consists in replacing the complex computer code by a reduced model, called a
metamodel or response surface, that represents the computer code and requires
acceptable computation time. One particular class of metamodels is studied:
the Gaussian process model, which is characterized by its mean and covariance
functions. A specific estimation procedure is developed to fit a Gaussian
process model in complex cases (nonlinear relations, highly dispersed or
discontinuous outputs, high-dimensional inputs, inadequate sampling designs,
etc.). The efficiency of this algorithm is compared to that of other existing
algorithms on an analytical test case. The proposed methodology is also
illustrated on a complex hydrogeological computer code simulating radionuclide
transport in groundwater.
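A Gaussian process metamodel in its simplest form can be sketched as follows: a squared-exponential kernel with fixed, hand-picked hyperparameters, so this illustrates only the modeling idea, not the paper's estimation procedure:

```python
import numpy as np

def gp_predict(x_train, y_train, x_test, length=0.3, sigma2=1.0, nugget=1e-6):
    """Posterior mean/std of a zero-mean GP with a squared-exponential kernel."""
    def k(a, b):
        return sigma2 * np.exp(-(a[:, None] - b[None, :])**2 / (2 * length**2))
    K = k(x_train, x_train) + nugget * np.eye(len(x_train))
    Ks = k(x_test, x_train)
    mean = Ks @ np.linalg.solve(K, y_train)
    var = sigma2 - np.einsum('ij,ji->i', Ks, np.linalg.solve(K, Ks.T))
    return mean, np.sqrt(np.clip(var, 0.0, None))

# A few runs of an "expensive" code, then cheap predictions anywhere
x_train = np.linspace(0.0, 1.0, 8)
y_train = np.sin(2 * np.pi * x_train)
mean, std = gp_predict(x_train, y_train, np.array([0.05, 0.5, 0.95]))
print(np.round(mean, 2))
```

The hard part addressed by the paper is precisely what this sketch fixes by hand: estimating the mean and covariance hyperparameters robustly in complex cases.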
Global sensitivity analysis for models with spatially dependent outputs
The global sensitivity analysis of a complex numerical model often calls for the estimation of variance-based importance measures, named Sobol' indices. Metamodel-based techniques have been developed to replace the CPU-time-expensive computer code with an inexpensive mathematical function that predicts the computer code output. Common metamodel-based sensitivity analysis methods are well suited to computer codes with scalar outputs. However, in the environmental domain, as in many areas of application, numerical model outputs are often spatial maps, which may also vary with time. In this paper, we introduce an innovative method to obtain a spatial map of Sobol' indices with a minimal number of numerical model computations. It is based upon the functional decomposition of the spatial output onto a wavelet basis and the metamodeling of the wavelet coefficients by Gaussian processes. An analytical example is presented to clarify the various steps of our methodology. This technique is then applied to a real hydrogeological case: for each model input variable, a spatial map of Sobol' indices is thus obtained.
Uncertainty quantification for functional dependent random variables
This paper proposes a new methodology to quantify the uncertainties associated with multiple dependent functional random variables linked to a quantity of interest, called the covariate. The proposed methodology is composed of two main steps. First, the functional random variables are decomposed on a functional basis. The decomposition basis is computed by the proposed Simultaneous Partial Least Squares algorithm, which decomposes all the functional variables simultaneously. Second, the joint probability density function of the decomposition coefficients associated with the functional variables is modeled by a Gaussian mixture model. A new method to estimate the parameters of the Gaussian mixture model, based on a Lasso penalization algorithm, is proposed. This algorithm estimates sparse covariance matrices, in order to reduce the number of model parameters to be estimated. Several criteria are proposed to assess the efficiency of the methodology. Finally, its performance is shown on an analytical example and on a nuclear reliability test case.
Calculations of Sobol indices for the Gaussian process metamodel
Global sensitivity analysis of complex numerical models can be performed by
calculating variance-based importance measures of the input variables, such as
the Sobol' indices. However, these techniques require a large number of model
evaluations and are therefore often impractical for computationally expensive
computer codes. A well-known and widely used remedy consists in replacing the
computer code by a metamodel, which predicts the model responses with
negligible computation time and makes the estimation of Sobol' indices
straightforward. In this paper, we discuss the Gaussian process model, which
gives analytical expressions for the Sobol' indices. Two approaches to
computing the Sobol' indices are studied: the first is based on the predictor
of the Gaussian process model and the second on the global stochastic process
model. Comparisons between the two estimates, made on analytical examples,
show the superiority of the second approach in terms of convergence and
robustness. Moreover, the second approach accounts for the modeling error of
the Gaussian process model by directly providing confidence intervals on the
Sobol' indices. These techniques are finally applied to a real case of
hydrogeological modeling.
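The predictor-based route (the first of the two approaches) can be sketched with a generic pick-freeze Monte Carlo estimator, using an invented closed-form function in place of a fitted GP predictor; the paper's analytical GP expressions are not reproduced here:

```python
import numpy as np

def predictor(x):
    """Stand-in for a fitted GP metamodel of the expensive code."""
    return x[:, 0] + 2.0 * x[:, 1]          # the third input is inert

rng = np.random.default_rng(3)
n, d = 100000, 3
A = rng.uniform(size=(n, d))
B = rng.uniform(size=(n, d))
yA = predictor(A)

# Pick-freeze estimate of the first-order Sobol' index of input i:
# S_i = Cov(Y_A, Y_AB_i) / Var(Y), where AB_i is B with column i taken from A
S = []
for i in range(d):
    AB = B.copy()
    AB[:, i] = A[:, i]
    S.append(np.cov(yA, predictor(AB))[0, 1] / yA.var())
print(np.round(S, 2))   # close to the analytic values [0.2, 0.8, 0.0]
```

Plugging the predictor into such an estimator ignores the metamodel's own error; the abstract's second approach, based on the full stochastic process, is what yields confidence intervals on the indices.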
- …